From:                              route@monster.com

Sent:                               Tuesday, June 04, 2013 3:54 PM

To:                                   hg@apeironinc.com

Subject:                          Please review this candidate for: Big Data

 

This resume has been forwarded to you at the request of Monster User xapeix01

Rajesh Choudhary 

Last updated:  04/30/13

Job Title:  Not specified

Company:  Not specified

Rating:  Not Rated

Screening score:  Not specified

Status:  Resume Received


Stamford  06901
US


 

 

RESUME

  

Resume Headline: Rajesh Choudhary - Solutions Architect

Resume Value: 2vrpwwacz6pseydy   

  

 

Rajesh Choudhary

Skyrajesh69@gmail.com

+1 203-822-4690

 

Professional Synopsis

·         Solution Architect with around 8 years of demonstrated expertise in business development and in designing, implementing, and managing BI/EDW/Hadoop projects.

 

Areas of Expertise             

·         BI/Hadoop/Big data solution architecting and design.

·         Road Map consulting for BI and Big data.

·         Pre-sales support – responding to RFPs, presentations, and negotiations with clients

·         Solution architecting skills in enterprise data warehousing

·         Developing Methodologies and Standards, BI tools evaluation and Data Warehousing

·         Requirements gathering, business analysis, technical facilitation              

 

Key Skills             

·         Big Data Architect (Hadoop, Splunk, MapReduce, Pig, Hive, HBase, MongoDB)

·         Worked on BI delivery projects in ETL and reporting.

·         Training and administration of BI tools.

·         Delivery experience with Hortonworks, SAS EBI, Teradata, Oracle, etc.

 

Proficiency Matrix

 

Pre-Sales – Business Development

·         Big Data, Splunk, SAS architecture

·         Tool Stack selection, BI comparative study.

·         BI architecture design and project planning, estimation and sizing.

·         Presented and defended solutions to large customers.

·         Business analysis, solution design, solution writing, and project estimation.

 

Functional – Data Warehousing/Big Data

·         Big Data architecture review for POCs

·         Handling the design, development, testing and deployment of data warehouse systems.

·         Deliver a project scope that directly supports the key business drivers

·         Define the overall data warehouse architecture (e.g., ETL process, ODS, EDW, Data Marts)

·         Define technical requirements, technical and data architectures for the data warehouse

·         Recommend/identify data warehouse technologies (e.g., ETL, DBMS, Data Quality, BI)

·         Design and direct the ETL process, including data quality and testing

·         Define metadata standards for the data warehouse

·         Feasibility study and understanding the design and business requirements.

 

Technical – BI/Hadoop Delivery

·         Deployment and administration of Splunk and the Hortonworks Distribution.

·         Defining and working with index stores in Splunk.

·         Worked with Splunk in pre- and post-Hadoop architectures.

·         Report building and querying in Splunk.

·         Hadoop-based architecture and administration.

·         Deployed the Hortonworks Distribution and architecture on Amazon EC2.

·         Worked on MapReduce, Pig, and Hive; used TeraSort; NameNode management and recovery.

 

Technical

Data Warehousing: Teradata, Netezza, Exadata

ETL: SAS DI, Informatica, Ab Initio

Reporting: Tableau, Cognos, SAS EBI Suite

Statistical Programming: R, SAS

Programming Languages: C++, Java, SQL, Base SAS, Macros, Shell

Apache Hadoop: HDFS, HBase, Hive, Mahout, Pig, MapReduce

Hadoop Solutions: Hortonworks, Cloudera, MapR, IBM BigInsights, Oracle Big Data

NoSQL DB with MR: Cassandra, MongoDB, CouchDB, Riak, Neo4j, Accumulo

RDBMS with MR: Greenplum, Vertica, Aster, Impala

Reporting with MR: Splunk, Datameer, ParAccel

Cloud: Amazon EC2 MapReduce, M7

Domain: Life Sciences, Finance, Banking and Insurance, Telecom, Manufacturing, Power

 

Education:

Graduation             

Bachelor of Computer Applications

Indira Gandhi National University, New Delhi

 

PG Executive program in Management

Executive PGP in International Business             

Indian Institute of Management, Lucknow

 

Professional Certifications and Webinars

·         Big Data Architect training conducted by Hortonworks

·         SAS administration by SAS Institute

·         SAS EBI by SAS Institute

·         Asset Liability and management by SAS Institute.

·         Life Science domain by Cognizant.

·         Solution Architecture training by Mahindra Satyam

·         Understanding MapR architecture by MapR

·         Greenplum Architecture

·         Datastax overview

·         Splunk architecture.

·         M7 overview by Google.

 

Management Trainings

·         Strategic Management from IIM Lucknow.

·         Macro Economics from University of Mumbai.

·         International Trade from University of Mumbai.

·         Industrial economics from University of Mumbai.

 

Professional Experience:

 

Since Feb 2007 with Mahindra Satyam

 

BI/Hadoop Architect, GE Capital, Stamford                                                                            Jan 2012 to March 2013

Technical Environment: SAS, Splunk, Hortonworks.

 

Responsibilities:

·         Hadoop based architecture remodelling for one reporting stream.

·         Deployment and administration of Horton works distribution.

·         Working with SAS using Hive.

·         Deployment and administration of Splunk and the Hortonworks Distribution.

·         Defining and working with index stores in Splunk.

·         Worked with Splunk in pre- and post-Hadoop architectures.

·         Report building and querying in Splunk.

·         NameNode management and console setup.

·         Working with Amazon EC2.

·         MongoDB-based collection loading and document creation.

·         Ensuring that all requirements are covered in SAS design, coding, and testing.

·         Guiding the team in impact assessment, solution definition, build, test and implementation. 

 

BI Solutions Architect, Mumbai                                                                                                            April 2010 to Dec 2012

 

Work Description

Architected solutions for RFP requests, coming up with winning responses. Coordinated with various BI/DW tool owners and built the solution. The major activities involved:

·         Working with BI/DW and Hadoop based architecture.

·         Apache Hadoop architecture and configuration of multi-node setups.

·         Understanding client requirements and architecting the solution

·         Evaluating tools & technologies in open source and open DB.

·         Preparing and evaluating estimations

·         Presentations to prospects

·         Leading and guiding POC teams

·         Coordinate with various teams like Sales, Finance, Delivery etc.

 

BI Lead, MNP Vodafone, Pune                                                                                                       Nov 2009 – March 2010

 

Technical Environment: Oracle, SAS EBI, UNIX

 

Responsibilities:

·         Analysis and requirement gathering.

·         Creating fresh views and ETL jobs.

·         Development of Info Maps.

·         Adding new dimensions in SAS WRS.

·         Testing of all new dimensions and reports.

 

ETL Lead, Suncorp-Metway, Brisbane, Australia                                                                             Feb 2008 to April 2009

 

Technical Environment: Database – Teradata, SQL Assistant

Scope: SAS DI Studio was used to implement the ETL process of the DW. Data was loaded from the legacy system to the base layer of the DW.

 

Responsibilities:

·         Feasibility study and understanding the design and business requirements.

·         Design and develop the ETL job logic and jobs.

·         Implement SAS Grid with LSF

·         Performance testing and UAT of the data.

·         Designing schedules and deploying them to server.

·         Project management and planning.

 

SAS Lead, ANZ IT, Bangalore                                                                                                         Feb 2007 to Jan 2008

 

Technical Environment: SAS ETL Studio, SMC, OLAP Cube Studio, SAS EG, SAS Web Report Studio, SAS IDP.

 

Scope:

·         Balanced Scorecard implemented for the Operations division of ANZ Bank.

 

Responsibilities:

·         Design and analysis of Balanced Scorecard deployment.

·         Using SAS ETL Studio, SAS OLAP Cube Studio, and SAS Portal for development.

·         Development of all reports and moving them to production after UAT.

·         Migrated SAS Codes and Batches from V8 to V9.

 

Cognizant Technology Solutions                                                                                                                      July 2005 – Jan 2007

 

Programmer Analyst, Pfizer CDM, Mumbai                                                                                                July 2005 – Jan 2007

Technical Environment: SAS, Oracle Clinical; OS: UNIX

 

·         Worked with SAS Enterprise Guide, Base SAS, and SAS Macros

·         Worked with clinical reporting teams on protocol-based reporting and related data management processing.

 


Experience


 

Job Title: Solutions Architect

Company: Mahindra Satyam

Experience: - Present

 

Additional Info


 

Desired Salary/Wage:

100,000.00 – 130,000.00 USD/yr

Current Career Level:

Experienced (Non-Manager)

Date of Availability:

From 1 to 3 months

Work Status:

US - I require sponsorship to work in this country.

Active Security Clearance:

None

US Military Service:

Citizenship:

Other

 

 

Target Job:

Target Job Title:

Solutions Architect

Desired Job Type:

Employee

Desired Status:

Full-Time

 

Target Company:

Company Size:

Industry:

Computer/IT Services

Occupation:

IT/Software Development

·         Enterprise Software Implementation & Consulting

 

Target Locations:

Selected Locations:

US

Relocate:

Yes

Willingness to travel:

Up to 100%

 

Languages:

Language – Proficiency Level

English – Beginner

Hindi – Beginner

Marathi – Beginner